Concentration of Measure Inequalities in Information Theory, Communications and Coding

Authors

  • Maxim Raginsky
  • Igal Sason
Abstract

Concentration inequalities have been the subject of exciting developments during the last two decades, and they have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices and percolation), information theory, learning theory, dynamical systems and randomized algorithms. This tutorial article is focused on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. The first part of this article introduces some classical concentration inequalities for martingales, and it also derives some recent refinements of these inequalities. The power and versatility of the martingale approach is exemplified in the context of binary hypothesis testing, codes defined on graphs and iterative decoding algorithms, and some other aspects that are related to wireless communications and coding. The second part of this article introduces the entropy method for deriving concentration inequalities for functions of many independent random variables, and it also exhibits its multiple connections to information theory. The basic ingredients of the entropy method are discussed first in conjunction with the closely related topic of logarithmic Sobolev inequalities, which are typical of the so-called functional approach to studying concentration of measure phenomena. The discussion on logarithmic Sobolev inequalities is complemented by a related viewpoint based on probability in metric spaces. This viewpoint centers around the so-called transportation-cost inequalities, whose roots are in information theory. 
Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, the tutorial addresses several applications of the entropy method and related information-theoretic tools to problems in communications and coding. These include strong converses for several source and channel coding problems, empirical distributions of good channel codes with non-vanishing error probability, and an information-theoretic converse for concentration of measure.
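As a quick illustration of the martingale-based bounds surveyed in the first part of the article, the following Python sketch (not taken from the article; the walk length and threshold are illustrative choices) compares the empirical tail probability of a simple ±1 random walk — a martingale with increments bounded by 1 — against the two-sided Azuma-Hoeffding bound P(|S_n| ≥ t) ≤ 2·exp(−t²/(2n)).

```python
import math
import random

def azuma_bound(t, n, c=1.0):
    """Two-sided Azuma-Hoeffding bound for a martingale with
    increments bounded in absolute value by c."""
    return 2 * math.exp(-t**2 / (2 * n * c**2))

def empirical_tail(t, n, trials=20000, seed=0):
    """Monte Carlo estimate of P(|S_n| >= t) for a simple +-1 random walk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s) >= t:
            hits += 1
    return hits / trials

n, t = 100, 30  # illustrative values: 100 steps, deviation threshold 30
emp = empirical_tail(t, n)
bound = azuma_bound(t, n)
# The empirical tail never exceeds the Azuma-Hoeffding bound,
# though the bound is typically loose by a constant in the exponent.
```

The looseness visible here (the empirical tail is roughly an order of magnitude below the bound) is precisely what motivates the refined versions of the Azuma-Hoeffding inequality discussed in the article.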


Related works

Concentration of Measure Inequalities in Information Theory, Communications, and Coding

During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer scienc...


Concentration of Measure Inequalities and Their Communication and Information-Theoretic Applications

During the last two decades, concentration of measure has been a subject of various exciting developments in convex geometry, functional analysis, statistical physics, high-dimensional statistics, probability theory, information theory, communications and coding theory, computer science, and learning theory. One common theme which emerges in these fields is probabilistic stability: complicated,...


Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and...


IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES The Martingale Approach for Concentration and Applications in Information Theory, Communications and Coding

This chapter introduces some concentration inequalities for discrete-time martingales with bounded increments, and it exemplifies some of their potential applications in information theory and related topics. The first part of this chapter briefly introduces discrete-time martingales and the Azuma-Hoeffding and McDiarmid inequalities, which are widely used in this context. It then derives these...


On Refined Versions of the Azuma-Hoeffding Inequality with Applications in Information Theory

This chapter introduces some concentration inequalities for discrete-time martingales with bounded increments, and it exemplifies some of their potential applications in information theory and related topics. The first part of this chapter briefly introduces discrete-time martingales and the Azuma-Hoeffding and McDiarmid inequalities, which are widely used in this context. It then derives these...



Journal:
  • Foundations and Trends in Communications and Information Theory

Volume 10  Issue

Pages  -

Publication year: 2013